Efficient MCMC sampling in dynamic mixture models

Authors

  • Gabriele Fiorentini
  • Christophe Planas
  • Alessandro Rossi
Abstract

We show how to improve the efficiency of MCMC sampling in dynamic mixture models by block-sampling the discrete latent variable. Two algorithms are proposed: the first is a multi-move extension of the single-move Gibbs sampler devised by Gerlach, Carter and Kohn (2000); the second is an adaptive Metropolis-Hastings scheme that performs well even when the number of discrete states is large. Three empirical examples illustrate the gain in efficiency achieved. We also show that visual inspection of sample partial autocorrelations of the discrete latent variable helps anticipate whether blocking can be effective.
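As a rough illustration of the diagnostic mentioned in the last sentence, the sketch below (ours, not the authors' code) computes sample partial autocorrelations of a discrete latent path via the Durbin-Levinson recursion, using a simulated persistent two-state Markov chain in place of actual single-move Gibbs output. Pronounced partial autocorrelations at short lags signal the kind of serial dependence in the latent path that block sampling is meant to break.

```python
# Illustrative sketch only: sample PACF of a simulated two-state latent path.
# The transition matrix P and the chain length are arbitrary choices.
import numpy as np

def sample_pacf(x, max_lag=20):
    """Sample partial autocorrelations via the Durbin-Levinson recursion."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    acf = np.array([np.dot(x[:n - k], x[k:]) / np.dot(x, x)
                    for k in range(max_lag + 1)])
    pacf = np.zeros(max_lag + 1)
    pacf[0] = 1.0
    phi = np.zeros(max_lag + 1)          # phi[j] holds phi_{k-1, j}
    for k in range(1, max_lag + 1):
        num = acf[k] - np.dot(phi[1:k], acf[1:k][::-1])
        den = 1.0 - np.dot(phi[1:k], acf[1:k])
        phi_kk = num / den
        phi[1:k] = phi[1:k] - phi_kk * phi[1:k][::-1]
        phi[k] = phi_kk
        pacf[k] = phi_kk
    return pacf

rng = np.random.default_rng(0)
# Persistent two-state chain standing in for single-move draws of the state
P = np.array([[0.97, 0.03], [0.05, 0.95]])
s = np.zeros(5000, dtype=int)
for t in range(1, len(s)):
    s[t] = rng.choice(2, p=P[s[t - 1]])

print(np.round(sample_pacf(s, max_lag=5), 3))
# Large short-lag partial autocorrelations suggest blocking should pay off.
```

The Durbin-Levinson recursion is used here only to keep the example dependency-free; any standard PACF routine would serve the same purpose.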

Similar articles

Interweaving Markov Chain Monte Carlo Strategies for Efficient Estimation of Dynamic Linear Models

In dynamic linear models (DLMs) with unknown fixed parameters, a standard Markov chain Monte Carlo (MCMC) sampling strategy is to alternate sampling of latent states conditional on fixed parameters and sampling of fixed parameters conditional on latent states. In some regions of the parameter space, this standard data augmentation (DA) algorithm can be inefficient. To improve efficiency, we app...
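As a concrete point of reference for the standard alternation described here, the following sketch (our own toy example with an assumed local-level DLM and inverse-gamma priors, not the interweaving strategy the paper goes on to propose) alternates a forward-filtering backward-sampling draw of the latent states with conjugate draws of the observation and state variances.

```python
# Toy data-augmentation Gibbs sampler for a local-level DLM:
#   y_t  = mu_t + eps_t,      eps_t ~ N(0, V)
#   mu_t = mu_{t-1} + eta_t,  eta_t ~ N(0, W)
# Priors (assumed for illustration): V, W ~ InvGamma(2, 1).
import numpy as np

rng = np.random.default_rng(2)

n, V_true, W_true = 200, 1.0, 0.1
mu_true = np.cumsum(np.sqrt(W_true) * rng.normal(size=n))
y = mu_true + np.sqrt(V_true) * rng.normal(size=n)

def ffbs(y, V, W, m0=0.0, C0=10.0):
    """Forward-filter backward-sample the state path given the variances."""
    n = len(y)
    m, C = np.zeros(n), np.zeros(n)
    for t in range(n):
        a = m0 if t == 0 else m[t - 1]            # prediction mean
        R = (C0 if t == 0 else C[t - 1]) + W      # prediction variance
        Q = R + V                                 # one-step forecast variance
        m[t] = a + R / Q * (y[t] - a)
        C[t] = R - R * R / Q
    state = np.zeros(n)
    state[-1] = m[-1] + np.sqrt(C[-1]) * rng.normal()
    for t in range(n - 2, -1, -1):
        B = C[t] / (C[t] + W)
        state[t] = (m[t] + B * (state[t + 1] - m[t])
                    + np.sqrt(C[t] * (1.0 - B)) * rng.normal())
    return state

V, W, keep = 1.0, 1.0, []
for it in range(2000):
    state = ffbs(y, V, W)                         # states | parameters
    resid, diff = y - state, np.diff(state)       # parameters | states
    V = 1.0 / rng.gamma(2.0 + n / 2, 1.0 / (1.0 + 0.5 * resid @ resid))
    W = 1.0 / rng.gamma(2.0 + (n - 1) / 2, 1.0 / (1.0 + 0.5 * diff @ diff))
    if it >= 1000:
        keep.append((V, W))
print(np.mean(keep, axis=0))   # posterior means of (V, W) after burn-in
```

As the snippet above notes, this baseline alternation can mix poorly in some regions of the parameter space, which is what interweaving strategies are designed to address.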

Sequentially-Allocated Merge-Split Sampler for Conjugate and Nonconjugate Dirichlet Process Mixture Models

This paper proposes a new efficient merge-split sampler for both conjugate and nonconjugate Dirichlet process mixture (DPM) models. These Bayesian nonparametric models are usually fit using Markov chain Monte Carlo (MCMC) or sequential importance sampling (SIS). The latest generation of Gibbs and Gibbs-like samplers for both conjugate and nonconjugate DPM models effectively update the model para...

Alternatives to Large VAR, VARMA and Multivariate Stochastic Volatility Models

In this paper, our proposal is to combine univariate ARMA models to produce a variant of the VARMA model that is much more easily implementable and does not involve certain complications. The original model is reduced to a series of univariate problems, and a copula-like term (a mixture-of-normals density) is introduced to handle dependence. Since the univariate problems are easy to handle b...

MCMC and Naive Parallel Gibbs Sampling

In these scribe notes, we are going to review the Parallel Markov Chain Monte Carlo (MCMC) method. First, we will recap MCMC methods, particularly the Metropolis-Hastings and Gibbs sampling algorithms. Then we will show the drawbacks of these classical MCMC methods as well as the Naive Parallel Gibbs Sampling approach. Finally, we will come up with the Sequential Monte Carlo and Parallel Inference f...
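For a concrete reminder of the building block recapped in those notes, here is a minimal random-walk Metropolis-Hastings sketch (our illustration, with an arbitrary two-component normal mixture as the target; it is not taken from the notes).

```python
# Minimal random-walk Metropolis-Hastings targeting 0.5*N(-2,1) + 0.5*N(2,1).
import numpy as np

def log_target(x):
    # Log of the unnormalised mixture density (equal weights cancel out).
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

rng = np.random.default_rng(1)
x, step, draws = 0.0, 1.0, []
for _ in range(20000):
    prop = x + step * rng.normal()                 # symmetric proposal
    # Accept with probability min(1, target(prop) / target(x)).
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    draws.append(x)
print(np.mean(draws), np.std(draws))   # roughly 0 and sqrt(5)
```

A Gibbs sampler replaces the accept/reject step with exact draws from each full conditional, which is the other classical algorithm those notes recap.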

Markov Chain Monte Carlo Methods and the Label Switching Problem in Bayesian Mixture Modeling

In the past ten years there has been a dramatic increase in interest in the Bayesian analysis of finite mixture models. This is primarily because of the emergence of Markov chain Monte Carlo (MCMC) methods. While MCMC provides a convenient way to draw inference from complicated statistical models, there are many, perhaps underappreciated, problems associated with the MCMC analysis of mixtures. ...

Journal:
  • Statistics and Computing

Volume 24, Issue -

Pages -

Publication date: 2014